88 research outputs found

    Evidential-EM Algorithm Applied to Progressively Censored Observations

    The Evidential-EM (E2M) algorithm is an effective approach for computing maximum likelihood estimates under finite mixture models, especially when the data carry uncertain information. In this paper we present an extension of the E2M method to a particular case of incomplete data, where the loss of information is due both to the mixture model and to censored observations. The prior uncertain information is expressed by belief functions, while the pseudo-likelihood function is derived from the imprecise observations and the prior knowledge. The E2M method is then invoked to maximize the generalized likelihood function and obtain the optimal parameter estimates. Numerical examples show that the proposed method can effectively integrate the uncertain prior information with the imprecise knowledge conveyed by the observed data.
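
    As an aside, the Python sketch below shows where such uncertainty can enter an ordinary EM loop: a plain two-component Gaussian-mixture EM in which each observation carries a weight (for instance a plausibility degree derived from prior belief). It is only an illustration of weighted EM, not the authors' E2M algorithm, and all names and numbers are made up.

        import numpy as np

        def weighted_gmm_em(x, w, n_iter=100):
            """EM for a two-component 1-D Gaussian mixture in which each
            observation x[i] contributes with weight w[i] (e.g. a plausibility
            degree in [0, 1] derived from prior belief)."""
            mu = np.array([x.min(), x.max()], dtype=float)   # crude initialisation
            sigma = np.array([x.std(), x.std()]) + 1e-6
            pi = np.array([0.5, 0.5])
            for _ in range(n_iter):
                # E-step: responsibilities (normalising constants cancel out)
                dens = np.stack([pi[k] * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2) / sigma[k]
                                 for k in range(2)], axis=1)
                resp = dens / dens.sum(axis=1, keepdims=True)
                rw = resp * w[:, None]                       # weight each point's contribution
                # M-step: weighted parameter updates
                nk = rw.sum(axis=0)
                pi = nk / nk.sum()
                mu = (rw * x[:, None]).sum(axis=0) / nk
                sigma = np.sqrt((rw * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
            return pi, mu, sigma

        rng = np.random.default_rng(0)
        x = np.concatenate([rng.normal(0, 1, 200), rng.normal(5, 1, 200)])
        w = np.ones_like(x)          # fully trusted observations
        print(weighted_gmm_em(x, w))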

    Belief Hierarchical Clustering

    In the data mining field, many clustering methods have been proposed, yet their standard versions do not take uncertain databases into account. This paper presents a new approach for clustering uncertain data using a hierarchical clustering defined within the belief function framework. The main objective of belief hierarchical clustering is to allow an object to belong to one or several clusters; a degree of belief is associated with each membership, and clusters are combined on the basis of pignistic probabilities. Experiments with real uncertain data show that the proposed method is a promising tool.
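
    For reference, the pignistic transformation mentioned above has a simple form: the mass of each focal set is shared equally among its elements. A minimal Python sketch, where the cluster names and mass values are made up for illustration:

        def pignistic(mass, frame):
            """Pignistic transformation: share the mass of each focal set
            equally among its elements (mass on the empty set, if any, is
            removed by renormalisation)."""
            m_empty = mass.get(frozenset(), 0.0)
            bet = {w: 0.0 for w in frame}
            for focal, m in mass.items():
                if not focal:
                    continue
                share = (m / (1.0 - m_empty)) / len(focal)
                for w in focal:
                    bet[w] += share
            return bet

        # one object's belief about its membership of clusters 'a' and 'b'
        m = {frozenset({'a'}): 0.5, frozenset({'a', 'b'}): 0.3, frozenset({'b'}): 0.2}
        print(pignistic(m, ['a', 'b']))      # {'a': 0.65, 'b': 0.35}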

    Evidential Clustering: A Review

    In evidential clustering, uncertainty about the assignment of objects to clusters is represented by Dempster-Shafer mass functions. The resulting clustering structure, called a credal partition, is shown to be more general than hard, fuzzy, possibilistic and rough partitions, which are recovered as special cases. Three algorithms to generate a credal partition are reviewed. Each of these algorithms is shown to implement a decision-directed clustering strategy, and their relative merits are discussed.
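
    A small sketch of what a credal partition looks like in practice, and of how a hard assignment can be read off it through the plausibility of each single cluster. The three toy mass functions below respectively encode certainty, probabilistic doubt and total ignorance; all values are illustrative.

        def singleton_plausibility(mass, clusters):
            """Plausibility of each individual cluster under a mass function
            defined on subsets of the set of clusters."""
            pl = {c: 0.0 for c in clusters}
            for focal, m in mass.items():
                for c in focal:
                    pl[c] += m
            return pl

        # a credal partition of three objects over two clusters: certainty,
        # probabilistic doubt, and total ignorance
        credal = [
            {frozenset({1}): 1.0},
            {frozenset({1}): 0.3, frozenset({2}): 0.7},
            {frozenset({1, 2}): 1.0},
        ]
        for mass in credal:
            pl = singleton_plausibility(mass, [1, 2])
            # hard assignment by maximum plausibility (ties, as in the
            # total-ignorance case, are broken arbitrarily)
            print(pl, '->', max(pl, key=pl.get))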

    Evidential Communities for Complex Networks

    Community detection is of great importance for understanding the graph structure of social networks. Communities in real-world networks often overlap, i.e. some nodes may belong to multiple clusters. Uncovering the overlapping communities/clusters of a complex network is a general problem in mining network data sets. In this paper, a novel algorithm to identify overlapping communities in complex networks is devised, combining an evidential modularity function, a spectral mapping method and evidential c-means clustering. Experimental results indicate that this detection approach takes advantage of the theory of belief functions and performs well both at detecting community structure and at determining the appropriate number of clusters. Moreover, the credal partition obtained by the proposed method gives a deeper insight into the graph structure.
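
    The spectral-mapping step of such a pipeline can be sketched in a few lines of NumPy: embed the nodes with eigenvectors of the normalised Laplacian, then hand the embedded points to a clustering routine (evidential c-means in the paper; any clustering algorithm could stand in). The graph below is a made-up toy example.

        import numpy as np

        def spectral_embedding(adj, dim):
            """Embed graph nodes with eigenvectors of the normalised Laplacian
            (the spectral-mapping step); the embedded points can then be fed
            to a clustering routine such as evidential c-means."""
            deg = adj.sum(axis=1)
            d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
            lap = np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt
            _, vecs = np.linalg.eigh(lap)        # eigenvalues in ascending order
            return vecs[:, 1:dim + 1]            # drop the first, trivial eigenvector

        # toy graph: two triangles joined by a single bridge edge
        adj = np.zeros((6, 6))
        for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
            adj[i, j] = adj[j, i] = 1.0
        print(spectral_embedding(adj, 2))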

    A reliability-based approach for influence maximization using the evidence theory

    Influence maximization is the problem of finding a set of social network users, called influencers, who can trigger a large propagation cascade. Influencers are very useful, for example, for making a marketing campaign go viral through social networks. In this paper, we propose an influence measure that combines several influence indicators. In addition, we take the reliability of each influence indicator into account and present a distance-based process to estimate it. The proposed measure is defined within the framework of the theory of belief functions. Furthermore, the reliability-based influence measure is used with an influence maximization model to select a set of users able to maximize the influence in the network. Finally, we present a set of experiments on a dataset collected from Twitter; these experiments show the performance of the proposed solution in detecting social influencers with good quality.
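
    Reliability weighting of this kind is usually carried out with the classical discounting operation of belief function theory: each indicator's mass function is scaled by the indicator's estimated reliability and the remaining mass is transferred to the whole frame. A minimal Python sketch with made-up numbers (not the paper's actual indicators):

        def discount(mass, reliability, frame):
            """Classical discounting: scale each focal mass by the source's
            reliability and transfer the remaining mass to the whole frame."""
            omega = frozenset(frame)
            out = {}
            for focal, m in mass.items():
                out[focal] = out.get(focal, 0.0) + reliability * m
            out[omega] = out.get(omega, 0.0) + (1.0 - reliability)
            return out

        # an indicator says a user is an influencer with mass 0.8, but the
        # indicator itself is judged only 60% reliable
        m = {frozenset({'yes'}): 0.8, frozenset({'yes', 'no'}): 0.2}
        print(discount(m, 0.6, ['yes', 'no']))
        # -> {frozenset({'yes'}): 0.48, frozenset({'yes', 'no'}): 0.52}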

    Evidential Bagging: Combining Heterogeneous Classifiers in the Belief Functions Framework

    In machine learning, ensemble learning methodologies are known to improve predictive accuracy and robustness. They consist in learning many classifiers whose outputs are then combined according to different techniques. Bagging, or Bootstrap Aggregating, is one of the most famous ensemble methodologies and is usually applied with the same base classification algorithm, i.e. the same type of classifier is learnt multiple times on bootstrapped versions of the initial learning dataset. In this paper, we propose a bagging methodology that involves different types of classifiers. The classifiers' probabilistic outputs are used to build mass functions, which are then combined within the belief functions framework. Three different ways of building the mass functions are proposed, and preliminary experiments on benchmark datasets showing the relevance of the approach are presented.
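
    The combination step rests on standard belief-function machinery; for instance, Dempster's rule merges mass functions built from two classifiers' outputs as sketched below. The mass assignments are made up, and the paper's actual recipes for turning probabilistic outputs into mass functions are not reproduced here.

        def dempster_combine(m1, m2):
            """Dempster's rule: conjunctive combination of two mass functions,
            with the conflicting (empty-intersection) mass normalised away."""
            combined, conflict = {}, 0.0
            for a, ma in m1.items():
                for b, mb in m2.items():
                    inter = a & b
                    if inter:
                        combined[inter] = combined.get(inter, 0.0) + ma * mb
                    else:
                        conflict += ma * mb
            return {f: v / (1.0 - conflict) for f, v in combined.items()}

        # mass functions built from two classifiers' outputs on the frame {pos, neg}
        m1 = {frozenset({'pos'}): 0.7, frozenset({'pos', 'neg'}): 0.3}
        m2 = {frozenset({'pos'}): 0.6, frozenset({'neg'}): 0.1, frozenset({'pos', 'neg'}): 0.3}
        print(dempster_combine(m1, m2))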

    Application of Uncertainty Modeling Frameworks to Uncertain Isosurface Extraction

    Proper characterization of uncertainty is a challenging task. Depending on the sources of uncertainty, various uncertainty modeling frameworks have been proposed and studied in the uncertainty quantification literature. This paper applies several such frameworks, namely possibility theory, Dempster-Shafer theory and probability theory, to isosurface extraction from uncertain scalar fields. It proposes an uncertainty-based marching cubes template as an abstraction of the conventional marching cubes algorithm with a flexible uncertainty measure. The applicability of the template is demonstrated using 2D simulation data from weather forecasting and computational fluid dynamics and a synthetic 3D dataset.
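
    One concrete instance of the underlying idea, under a purely probabilistic uncertainty model: estimate by Monte Carlo the probability that an isocontour crosses a grid cell whose corner values are noisy. This is only a sketch of the general principle, not the paper's template; the Gaussian-noise assumption and all numbers are illustrative.

        import numpy as np

        def crossing_probability(corner_means, corner_stds, isovalue, n_samples=10_000):
            """Monte-Carlo estimate of the probability that an isocontour crosses
            a grid cell, assuming independent Gaussian noise on the corner values."""
            rng = np.random.default_rng(0)
            samples = rng.normal(corner_means, corner_stds,
                                 size=(n_samples, len(corner_means)))
            above = samples > isovalue
            # the cell is crossed whenever its corners do not all lie on the same side
            crossed = ~(above.all(axis=1) | (~above).all(axis=1))
            return crossed.mean()

        # a 2-D cell whose four noisy corner values straddle the isovalue 0.5
        print(crossing_probability([0.2, 0.4, 0.6, 0.8], [0.1] * 4, 0.5))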

    Generalised max entropy classifiers

    In this paper we propose a generalised maximum-entropy classification framework, in which the empirical expectation of each feature function is bounded by the lower and upper expectations induced by the lower and upper probabilities associated with a belief measure. This generalised setting permits a more cautious appreciation of the information content of a training set. We analytically derive the Karush-Kuhn-Tucker conditions for the generalised max-entropy classifier in the case where a Shannon-like entropy is adopted.
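
    Schematically, and in illustrative notation rather than the paper's, such a programme maximises the conditional entropy while only bounding the feature expectations between lower and upper values instead of matching them exactly:

        \max_{p}\; -\sum_{x}\hat p(x)\sum_{y} p(y\mid x)\,\log p(y\mid x)
        \quad \text{subject to} \quad
        \underline{E}[\phi_j] \,\le\, \sum_{x,y}\hat p(x)\,p(y\mid x)\,\phi_j(x,y) \,\le\, \overline{E}[\phi_j]
        \quad (j=1,\dots,m), \qquad \sum_{y} p(y\mid x)=1 \ \text{for every } x,

    where \underline{E}[\phi_j] and \overline{E}[\phi_j] denote the lower and upper expectations of feature \phi_j induced by the belief measure; the Karush-Kuhn-Tucker conditions of such a constrained programme characterise its solution.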

    Reasoning with imprecise belief structures
